IMPROVING COVERAGE AND NOVELTY OF ABSTRACTIVE TEXT SUMMARIZATION USING TRANSFER LEARNING AND DIVIDE AND CONQUER APPROACHES

Authors

Abstract

Automatic Text Summarization (ATS) models yield outcomes with insufficient coverage of crucial details and poor degrees of novelty. The first issue results from the lengthy input, while the second stems from the characteristics of the training dataset itself. This research employs a divide-and-conquer approach to address the first issue by breaking the input into smaller pieces to be summarized, followed by a conquest of the results in order to cover more significant details. For the second challenge, these chunks are summarized by models trained on datasets with higher novelty levels, producing human-like concise summaries with novel words that do not appear in the article. The results demonstrate an improvement in both coverage and novelty levels. Moreover, we define a new metric to measure the novelty of a summary. Finally, we investigate the findings to discover whether novelty is influenced by the dataset itself, such as CNN/DM, or by the model and its objective, such as Pegasus.
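The abstract outlines two mechanisms: splitting a long input into chunks that are summarized independently and then merged (divide and conquer), and a novelty score based on words that appear in the summary but not in the source article. The sketch below illustrates both ideas; the chunk size, the `google/pegasus-xsum` checkpoint, and the unigram-based novelty score are illustrative assumptions, not the paper's actual configuration or metric.

```python
# Illustrative sketch only -- not the authors' exact pipeline.
# Assumes the Hugging Face `transformers` library; `google/pegasus-xsum`
# is a stand-in for a model trained on a higher-novelty dataset.
from transformers import pipeline


def chunk_text(article: str, max_words: int = 400) -> list[str]:
    """Divide: split a long article into word-bounded chunks."""
    words = article.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]


def summarize_divide_and_conquer(article: str) -> str:
    """Conquer: summarize each chunk, then merge the partial summaries."""
    summarizer = pipeline("summarization", model="google/pegasus-xsum")
    chunk_summaries = [
        summarizer(chunk, truncation=True)[0]["summary_text"]
        for chunk in chunk_text(article)
    ]
    return " ".join(chunk_summaries)


def novelty(summary: str, article: str) -> float:
    """Toy novelty score: fraction of summary unigrams absent from the article.
    The paper defines its own metric; this is only a stand-in for the idea."""
    source_vocab = set(article.lower().split())
    summary_tokens = summary.lower().split()
    if not summary_tokens:
        return 0.0
    return sum(t not in source_vocab for t in summary_tokens) / len(summary_tokens)
```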


Similar Articles

The Study of Practical and Theoretical Foundation of Credit Risk and Its Coverage

After examining each of the factors of industry type, guarantee type, interest rate, inflation rate, country credit risk, fee, recovery, GDP, coverage, and collateral with respect to the credit risk of the Export Guarantee Fund of Iran, it was determined that all factors, except country credit risk and fee, have a significant relationship with credit risk. Furthermore, interest rate, inflation rate, recovery, industry type, and country risk have an inverse effect on credit risk, while coverage and colla...


The Relationship between Using Language Learning Strategies, Learners’ Optimism, Educational Status, Duration of Learning and Demotivation

With the growth of more humanistic approaches towards teaching foreign languages, more emphasis has been put on learners’ feelings, emotions and individual differences. One of the issues in teaching and learning English as a foreign language is demotivation. The purpose of this study was to investigate the relationship between the components of language learning strategies, optimism, duration o...


Neural Abstractive Text Summarization

Abstractive text summarization is a complex task whose goal is to generate a concise version of a text without necessarily reusing the sentences from the original source, but still preserving the meaning and the key contents. We address this issue by modeling the problem as sequence-to-sequence learning and exploiting Recurrent Neural Networks (RNNs). This work is a discussion about our ongoi...


TL;DR: Improving Abstractive Summarization Using LSTMs

Traditionally, summarization has been approached through extractive methods. However, they have produced limited results. More recently, neural sequence-to-sequence models for abstractive text summarization have shown more promise, although the task still proves to be challenging. In this paper, we explore current state-of-the-art architectures and reimplement them from scratch. We begin with a ...


Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond

In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora. We propose several novel models that address critical problems in summarization that are not adequately modeled by the basic architecture, such as modeling key-words, capturing the hierarchy of sentenc...



Journal

Journal title: Malaysian Journal of Computer Science

Year: 2023

ISSN: 0127-9084

DOI: https://doi.org/10.22452/mjcs.vol36no3.4